.<<
.require "memo.pub[let,jmc]" source
.>>
.device xgp
.height←52;	turn on "%α"
.  page frame height high 85 wide
.  title area heading lines 1 to 10 char 1 to 80;
.  area text lines 4 to height char 1 to 80;
.
.font 1 "micr25"; font 2 "sail25"; font 3 "quux25";
.macro cb(head) ⊂ if lines<5 then next page else skip; once center; select 3
head

.⊃
.
.cb FORMALIZATION OF OCKHAM'S RAZOR
.if false then begin

Notes:
ai and philosophy and maybe or

Plurality is not to be assumed without necessity.

What can be done with fewer [assumptions] is in vain done
with more.

Entities are not to be multiplied without necessity.

1. revise example so that it fits Ockham's razor better.

2. give circumscription rule

3. give some version of Reiter's rule.

.end
.at "⊗" word " "	⊂ ("%2word%* ") ⊃;
	Research in artificial intelligence involves two components
which may be called ⊗epistemology and ⊗heuristics.  Heuristics deals
with searching spaces of possibilities, with pattern matching, with
planning, and with many other topics.  The epistemological part of AI,
which seems to be contiguous with some of the epistemological problems
of philosophy, is concerned with what the general facts of the common
sense world are and how they may be represented in the memory of a
computer, with how the particular facts of a situation are consequences
of observation or communication and how they may be represented, and
with the rules of inference or conjecture whereby an organism or
machine may justifiably go from premisses to conclusions.

	The work described here arises from trying to make computers
reason, i.e. draw conclusions from facts, in common sense situations.
An early approach (McCarthy 1960), which had partial success
and whose ideas have led to other approaches, is to represent
facts about the world in general and about particular situations by
sentences of some system of mathematical logic.  First order languages
have been mainly used, because they are best understood and because
set theory, which is as powerful as any system of logic, can be formulated
as a first order theory.

	While the facts of particular situations are often conveniently
represented in first order formalisms, it eventually
became clear that not all human reasoning, and not all reasoning
that computers must do to behave intelligently, can be modelled
by the rules of inference of a system of mathematical logic.
Thus it becomes necessary to supplement the valid %2rules of inference%1
of logic with %2rules of conjecture%1.  Surprisingly, it turns out that
some of these rules of conjecture admit mathematical formalization as
precise as that of the rules of inference of logic.
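
	One such rule of conjecture is the %2circumscription%1 schema
of (McCarthy 1980).  Circumscribing a predicate ⊗P in a sentence
%2A(P)%1 conjectures that the only objects satisfying ⊗P are those
that ⊗A forces to satisfy it: for any predicate parameter %2Φ%1 we
may conjecture

.begin nofill narrow 8; select 2;

A(Φ) ∧ ∀x.(Φ(x) ⊃ P(x)) ⊃ ∀x.(P(x) ⊃ Φ(x)).

.end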

	The problem is that we often draw conclusions in the absence
of certain facts that we would not draw in their presence.  This violates
a monotonicity property possessed by all systems of mathematical logic.
Symbolically ⊗monotonicity is expressed by

.begin nofill narrow 8; select 2;

A ⊢ p
A ⊂ B
______
B ⊢ p

.end

Here ⊗A and ⊗B are sets of sentences, ⊗p is a sentence, and
%2A_⊢_p%1 is read %2"p is logically deducible from A"%1.  The point
is that if ⊗p is a logical consequence of a collection ⊗A of sentences
and ⊗B is a more inclusive collection of sentences, then ⊗p is a
logical consequence of ⊗B.  Indeed the proof of ⊗p from ⊗A will also
serve as a proof of ⊗p from ⊗B. 
Monotonicity is a characteristic of all present logical systems, and
perhaps of all possible logical systems that do only reasoning
guaranteed to be correct.
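
	Monotonicity can be seen concretely in any mechanical deducer.
The following sketch, mine rather than anything from the papers cited
here, is a toy propositional forward chainer written in Python; as the
final assertion checks, enlarging the set of premisses can only
enlarge the set of conclusions.

.begin nofill; select 2;

# A minimal sketch of monotonic deduction (illustrative only).
# Rules are (premises, conclusion) pairs over atomic sentences.
def consequences(facts, rules):
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in derived and \
               all(p in derived for p in premises):
                derived.add(conclusion)
                changed = True
    return derived

rules = [({"owns-car"}, "can-ask-for-ride")]
A = {"owns-car"}
B = A | {"car-is-red"}           # B includes A
# every consequence of A is also a consequence of B
assert consequences(A, rules) <= consequences(B, rules)

.end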

	That much ordinary reasoning is not monotonic is best
seen with the aid of an example.  I know that you own a car, and
I conclude that it is appropriate to ask you for a ride to the airport.
Before asking, however, I learn that your car is being repaired and
conclude that it is not appropriate.  Then I learn that you have your
wife's car, etc., etc.  No new fact contradicts the previous collection
of facts, but each new fact reverses the appropriateness of asking for a
ride.
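
	This behavior is easy to imitate with an ad hoc program even
though it cannot be imitated by monotonic deduction.  The following
Python sketch is again mine and implements none of the formalisms
cited below; it merely shows a default conclusion being withdrawn and
restored as facts are added without contradiction.

.begin nofill; select 2;

# Toy version of the airport example (illustrative only).
def appropriate_to_ask(facts):
    # default: owning a car makes asking appropriate,
    # unless the car is known to be unavailable
    if "owns-car" not in facts:
        return False
    unavailable = "car-being-repaired" in facts
    return not unavailable or "has-wifes-car" in facts

facts = {"owns-car"}
print(appropriate_to_ask(facts))      # True
facts.add("car-being-repaired")       # no contradiction
print(appropriate_to_ask(facts))      # False
facts.add("has-wifes-car")            # no contradiction
print(appropriate_to_ask(facts))      # True again

.end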

	One can attempt to deal with this phenomenon in various ways,
and perhaps the most congenial to my preconceptions of the preconceptions
of this audience would be to introduce probabilities.  For reasons I
shall discuss later, I don't believe this meets the problem head on,
and I shall instead discuss two non-probabilistic approaches to
non-monotonic reasoning that have been taken by researchers in
artificial intelligence.
Both approaches are discussed in articles in a forthcoming issue of the
journal %2Artificial Intelligence%1.  See references
(McDermott and Doyle 1980), (Reiter 1980)
and (McCarthy 1980).

References:

%3McCarthy, John (1960)%1: "Programs with Common Sense", %2Proceedings of
the Teddington Conference on the Mechanization of Thought Processes%1, H.M.
Stationery Office, 1960.

%3McCarthy, John (1980)%1: "Circumscription - A Form of Non-Monotonic
Reasoning", %2Artificial Intelligence%1, forthcoming.

%3McDermott, Drew and Jon Doyle (1980)%1: "Non-Monotonic Logic I",
%2Artificial Intelligence%1, forthcoming.

%3Reiter, Raymond (1980)%1: "A Logic for Default Reasoning",
%2Artificial Intelligence%1, forthcoming.